67 research outputs found

    Quasidifferentiable Functions: Necessary Conditions and Descent Directions

    In this paper, the author studies the necessary conditions for an extremum when either the function to be optimized or the function describing the set on which optimization must be carried out is nondifferentiable. The author's main concern is with quasidifferentiable functions, but the smooth and convex cases are also discussed.
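
    For orientation only (these are the standard definitions, not quoted from the paper): a function $f$ is called quasidifferentiable at $x$ if its directional derivative exists for every direction $g$ and admits the representation
    \[
      f'(x;g) \;=\; \max_{v \in \underline{\partial} f(x)} \langle v, g\rangle \;+\; \min_{w \in \overline{\partial} f(x)} \langle w, g\rangle ,
    \]
    where the pair $Df(x) = \bigl[\underline{\partial} f(x), \overline{\partial} f(x)\bigr]$ of convex compact sets is a quasidifferential of $f$ at $x$. In this notation, a typical necessary condition for an unconstrained minimum reads $-\overline{\partial} f(x^*) \subseteq \underline{\partial} f(x^*)$; the paper's exact statements, and the constrained case, may differ in detail.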

    Directional Differentiability of a Continual Maximum Function of Quasidifferentiable Functions

    Much recent work in optimization theory has been concerned with the problems caused by nondifferentiability. Some of these problems have now been at least partially overcome by the definition of a new class of nondifferentiable functions called quasidifferentiable functions, and by the extension of classical differential calculus to deal with this class. This has led to increased theoretical research into the properties of quasidifferentiable functions and their behavior under different conditions. In this paper, the problem of the directional differentiability of a maximum function over a continual set of quasidifferentiable functions is discussed. It is shown that, in general, the operation of taking the "continual" maximum (or minimum) leads to a function which is not necessarily itself quasidifferentiable.
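
    For comparison (not taken from the abstract itself), the finite-index case is classical: for $\varphi(x) = \max_{i \in I} f_i(x)$ with finitely many smooth $f_i$, the directional derivative is $\varphi'(x;g) = \max_{i \in R(x)} \langle \nabla f_i(x), g\rangle$, where $R(x) = \{ i : f_i(x) = \varphi(x)\}$ is the set of active indices. The natural candidate for the continual case is a Danskin-type formula
    \[
      \varphi(x) = \max_{y \in G} f(x,y), \qquad
      \varphi'(x;g) = \max_{y \in R(x)} f_x'(x,y;g), \qquad
      R(x) = \{\, y \in G : f(x,y) = \varphi(x) \,\},
    \]
    which holds only under suitable compactness and uniformity assumptions; the abstract's main point is that, even when directional differentiability survives, quasidifferentiability itself need not.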

    A Directional Implicit Theorem for Quasidifferentiable Functions

    The implicit and inverse function theorems of classical differential calculus represent an essential element in the structure of the calculus. In this paper, the authors consider problems related to deriving analogous theorems in quasidifferential calculus.
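
    For context, the smooth statement being generalized (standard, not taken from the paper) is: if $F(x_0,y_0)=0$, $F$ is continuously differentiable near $(x_0,y_0)$, and $F_y(x_0,y_0)$ is invertible, then $F(x,y(x))=0$ defines $y(\cdot)$ locally and
    \[
      y'(x_0) = -\,F_y(x_0,y_0)^{-1}\, F_x(x_0,y_0).
    \]
    A directional analogue for quasidifferentiable $F$ would instead describe the directional derivatives $y'(x_0;g)$ of the implicit function, which is the kind of statement pursued here.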

    Nonsmoothness and Quasidifferentiability

    This paper presents a survey of results related to quasidifferential calculus. First, we discuss different classes of directionally differentiable functions (convex functions, maximum functions and quasidifferentiable functions). Several generalizations of the concept of a subdifferential are considered, and the place and role of quasidifferentiable functions are outlined.
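
    As a reminder of the prototype that these generalizations extend (standard convex analysis, not quoted from the survey): for a finite convex function $f$ on $\mathbb{R}^n$,
    \[
      \partial f(x) = \{\, v : f(z) \ge f(x) + \langle v, z - x\rangle \ \ \forall z \,\}, \qquad
      f'(x;g) = \max_{v \in \partial f(x)} \langle v, g\rangle ,
    \]
    and the various subdifferential concepts discussed in the survey can be read as different ways of retaining a set-valued object with this kind of support-function representation for broader classes of directionally differentiable functions.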

    An Algorithm for Minimizing a Certain Class of Quasidifferentiable Functions

    One interesting and important class of nondifferentiable functions consists of smooth compositions of max-type functions. Such functions are of practical value and have been studied extensively by several researchers. We treat them as quasidifferentiable functions and analyze them using quasidifferential calculus. One special subclass (namely, the sum of a max-type function and a min-type function) has been studied by T.I. Sivelina. The main feature of the algorithm described in the present paper is that at each step it is necessary to consider a bundle of auxiliary directions and points, of which only one can be chosen for the next step. This requirement seems to arise from the intrinsic nature of nondifferentiable functions.
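
    The paper's algorithm is not reproduced in this listing. As a rough, hypothetical illustration of the "bundle of candidate directions, of which one is chosen" idea for the subclass f(x) = max_i phi_i(x) + min_j psi_j(x) with smooth phi_i and psi_j, here is a heuristic sketch; the names (phis, psis, descent_step) and the pairing rule used to build candidates are assumptions made for illustration, not the method from the paper.

    import numpy as np

    def directional_derivative(x, g, phis, psis, tol=1e-8):
        # f'(x; g) for f(x) = max_i phi_i(x) + min_j psi_j(x),
        # where each phi/psi is a callable returning (value, gradient).
        pv = [p(x) for p in phis]
        qv = [q(x) for q in psis]
        pmax = max(v for v, _ in pv)
        qmin = min(v for v, _ in qv)
        d_max = max(grad @ g for v, grad in pv if abs(v - pmax) <= tol)
        d_min = min(grad @ g for v, grad in qv if abs(v - qmin) <= tol)
        return d_max + d_min

    def candidate_directions(x, phis, psis, tol=1e-8):
        # One candidate direction per active index of the min part:
        # pair each active gradient of psi with the nearest active gradient
        # of phi and move against their sum (a heuristic stand-in for the
        # exact min-norm computation over the convex hulls).
        pv = [p(x) for p in phis]
        qv = [q(x) for q in psis]
        pmax = max(v for v, _ in pv)
        qmin = min(v for v, _ in qv)
        p_act = [grad for v, grad in pv if abs(v - pmax) <= tol]
        q_act = [grad for v, grad in qv if abs(v - qmin) <= tol]
        dirs = []
        for w in q_act:
            v = min(p_act, key=lambda u: np.linalg.norm(u + w))
            s = v + w
            if np.linalg.norm(s) > tol:
                dirs.append(-s / np.linalg.norm(s))
        return dirs

    def descent_step(x, phis, psis, step=1e-2, tol=1e-8):
        # Evaluate f'(x; g) for every candidate and keep the best one;
        # return None when no candidate yields descent.
        best_g, best_d = None, 0.0
        for g in candidate_directions(x, phis, psis, tol):
            d = directional_derivative(x, g, phis, psis, tol)
            if d < best_d:
                best_g, best_d = g, d
        return None if best_g is None else x + step * best_g

    On this subclass the quasidifferential at x is generated by the active gradients of the phi_i (subdifferential part) and of the psi_j (superdifferential part), which is why one candidate per active psi_j is a natural way to form the bundle.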

    Nondifferentiable Optimization: Motivations and Applications

    IIASA has been involved in research on nondifferentiable optimization since 1976. The Institute's research in this field has been very productive, leading to many important theoretical, algorithmic and applied results. Nondifferentiable optimization has now become a recognized and rapidly developing branch of mathematical programming. To continue this tradition and to review developments in the field, IIASA held this Workshop in Sopron (Hungary) in September 1984. This volume contains selected papers presented at the Workshop. It is divided into four sections dealing with the following topics: (I) Concepts in Nonsmooth Analysis; (II) Multicriteria Optimization and Control Theory; (III) Algorithms and Optimization Methods; (IV) Stochastic Programming and Applications.

    Parallelization of the discrete gradient method of non-smooth optimization and its applications

    We investigate the parallelization and performance of the discrete gradient method of nonsmooth optimization. This derivative-free method is shown to be an effective optimization tool, able to skip many shallow local minima of nonconvex nondifferentiable objective functions. Although this is a sequential iterative method, we were able to parallelize critical steps of the algorithm, and this led to a significant improvement in performance on multiprocessor computer clusters. We applied this method to a difficult polyatomic cluster problem in computational chemistry, and found it to outperform other algorithms.
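
    The parallel implementation itself is not shown in this listing. As a minimal, hypothetical illustration of the kind of parallelism described, the sketch below evaluates in parallel the independent function values needed for coordinate-wise difference quotients; it is only a stand-in for the discrete gradient construction and is not the authors' code.

    from concurrent.futures import ProcessPoolExecutor
    import numpy as np

    def parallel_difference_quotients(f, x, h=1e-6, workers=4):
        # Evaluate the n + 1 function values needed for one-sided,
        # coordinate-wise difference quotients in parallel.  The point is
        # that these evaluations are independent, which is what makes the
        # expensive inner loop of derivative-free methods parallelizable.
        # Note: f must be picklable (a module-level function) so that
        # ProcessPoolExecutor can ship it to worker processes.
        x = np.asarray(x, dtype=float)
        points = [x] + [x + h * e for e in np.eye(x.size)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            vals = list(pool.map(f, points))
        return (np.array(vals[1:]) - vals[0]) / h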